
    Pre-HE Mentoring Programmes: Rapid Evidence Review

    This is a rapid review of a sample of evaluation reports concerning pre-HE (outreach) mentoring programmes for young people. The report identified a number of key practical and implementation issues associated with the impact of mentoring programmes, including the targeting, selection and recruitment of mentees and mentors, the role of mentor training, and how mentoring is understood to deliver its objectives (either as a delivery mechanism for other programme elements, or as itself the key mechanism of change). The report goes on to provide an overview of the evaluation approaches adopted across the case studies and identifies a series of evaluation challenges, many of which are likely to be common to the evaluation of other pre-HE outreach activities. The report concludes with a summary of nine case studies, comparing mentoring programme design and implementation, intended outcomes and evaluation approach. Although primarily written for an audience considering developing or delivering a mentoring programme in this space, we hope that elements of this report, including a discussion of evaluation challenges, may have wider relevance.

    Evaluation of outreach interventions for under 16 year olds. Tools and guidance for higher education providers

    During 2017-18, OFFA commissioned research that aimed to understand the nature of outreach activities for under 16 year olds (which were funded through access and participation investment) and how these were evaluated. This document, developed from the research, is intended to act as a resource for pre-16 outreach practitioners and evaluators, drawing both on the data collected by this project and the wider literature around evaluation and outreach. It seeks to recognise the complexity of pre-16 outreach work and eschews a prescriptive approach in favour of establishing important principles and actions that are likely to underpin good practice. Our discussion is broadly positioned within a ‘social realist’ worldview (Archer, 2008; Pawson, 2013) that seeks to understand the fuzzy nature of the cause-and-effect relationships that exist within complex social fields, where individuals construct their own realities in reference to those around them. There is a particular focus on epistemology – the pathways to creating dependable, if contingent, knowledge – as a vehicle for making meaning from data that is usually incomplete, compromised or mediated through young people’s emergent constructions of their worlds. Fundamentally, outreach is predicated on the ability of practitioners to influence young people in a planned way, albeit that the plan will not always work for every young person in every cohort. An important element in this epistemology is that it is not concerned with finding single ‘solutions’ that exist outside time and context. Rather, it is concerned with understanding how young people are influenced by their life experiences – not ‘what works’, but what works in a given context and, importantly, why. It is only through understanding the latter element that practices can become robustly effective in the long-term and potentially transferable to other contexts. This is particularly appropriate to pre-16 outreach work due to the lengthy time lag between activity and application to higher education (HE).
    Office for Students (OfS)

    Understanding the evaluation of access and participation outreach interventions for under 16 year olds

    The project team was asked to address the following six research questions, which were used to guide the project:
    1. What are the intended outcomes for current outreach interventions directed at under 16 year olds from disadvantaged backgrounds where the long-term aim is to widen access to higher education (HE)?
    2. What types of outreach intervention activity or activities are institutions using in relation to intended outcomes?
    3. What evaluation tools, methods and metrics are being used to measure the intended outcomes?
    4. What are the perceived and actual challenges and barriers for different stakeholders to effective evaluation of long-term outreach?
    5. What do different stakeholders consider most effective evaluation practice and why?
    6. How valid and suitable are the evaluation tools, methods and metrics (identified through the research) that are commonly used?
    The project was constructed around six interlinked work packages:
    1. A quantitative analysis of what higher education providers (HEPs) say about their pre-16 outreach activities (and their evaluation) in their 2017-18 access agreements (as the most recent available).
    2. An online survey of HEPs to gather information about the pre-16 outreach activities delivered during the 2016-17 academic year and their evaluation, as well as the structure of their evaluation resources and challenges faced.
    3. Case studies of four HEPs identified as demonstrating elements of good practice through their access agreements and the online survey, derived from telephone interviews with key staff and documentary analysis.
    4. Telephone interviews with 11 third sector organisations (TSOs) to explore their practices and the evaluation of their activities, providing a counterpoint to the data collected from higher education institutions (HEIs).
    5. A synthesis of the four preceding work packages to explore elements of good practice, determine a basis for assessing the quality of evaluations and highlight challenges for the sector and OFFA.
    6. An invited participatory workshop for evaluators from HEPs and TSOs identified as demonstrating elements of good practice through the online survey and telephone interviews, to act as a sounding board for the emerging conclusions and recommendations.
    Office for Students (OfS)

    Evaluation of access and participation plans: Understanding what works

    We present an analysis of two current policy options to improve evaluation of access and participation work: independent external evaluation vs. in-house evaluation. Evaluation of access and participation work needs to be well-conducted, objective and widely disseminated, regardless of the outcome. Independent external evaluation is likely to provide objectivity and the right skills, but providing effective and timely feedback may be prohibitively expensive. Without support, in-house practitioner teams risk a lack of objectivity and skills. Neither external nor in-house evaluation is likely to solve issues of publication bias; the use of open science principles could help. Working with academics and other experts internal to the institution could provide the skills to work well under the open science framework. Working as a sector to avoid duplication of effort is likely to get us further, faster.


    Mary Esteve, The Aesthetics and Politics of the Crowd in American Literature

    No full text

    Impact evaluation with small cohorts: methodology guidance

    This guidance presents impact evaluation methodologies that can be used with small cohorts.